Low Rank Regularization: A review
Authors
Abstract
Low Rank Regularization (LRR), in essence, involves introducing a low rank or approximately low rank assumption on the target we aim to learn, and it has achieved great success in many data analysis tasks. Over the last decade, much progress has been made in both theories and applications. Nevertheless, the intersection between these two lines is rare. In order to construct a bridge between practical applications and theoretical studies, this paper provides a comprehensive survey of LRR. Specifically, we first review recent advances on two issues that all LRR models are faced with: (1) rank-norm relaxation, which seeks a relaxation to replace the rank minimization problem; and (2) model optimization, which seeks an efficient optimization algorithm to solve the relaxed models. For the first issue, we provide a detailed summarization of various relaxation functions and conclude that non-convex relaxations can alleviate the punishment bias problem of convex relaxations. For the second issue, we summarize the representative optimization algorithms used in previous studies and analyze their advantages and disadvantages. As the main goal is to promote the application of non-convex relaxations, we conduct extensive experiments to compare different relaxation functions. The experimental results demonstrate that non-convex relaxations generally enjoy a large advantage over convex ones. Such a result is inspiring for further improving the performance of existing models.
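The punishment bias contrast between convex and non-convex relaxations can be illustrated with a small denoising experiment. The sketch below is an illustrative assumption, not the survey's experimental protocol: it compares singular value soft-thresholding (the proximal operator of the nuclear norm, the standard convex relaxation of rank) against a truncated variant in the spirit of non-convex relaxations, which leaves the leading singular values unshrunk.

```python
import numpy as np

def svt(X, tau):
    """Singular value soft-thresholding: the proximal operator of the
    nuclear norm. Every singular value is reduced by tau, so large
    (informative) singular values are penalized as much as small
    (noise) ones -- the source of the punishment bias."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def truncated_svt(X, tau, r):
    """A simple non-convex alternative (truncated nuclear norm style,
    illustrative): the r largest singular values are kept untouched and
    only the tail is shrunk, alleviating the bias on the signal part."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_new = s.copy()
    s_new[r:] = np.maximum(s_new[r:] - tau, 0.0)
    return U @ np.diag(s_new) @ Vt

rng = np.random.default_rng(0)
# Rank-3 ground truth plus small Gaussian noise.
L = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 50))
X = L + 0.1 * rng.standard_normal((50, 50))

err_convex = np.linalg.norm(svt(X, 2.0) - L)
err_nonconvex = np.linalg.norm(truncated_svt(X, 2.0, r=3) - L)
print(err_convex, err_nonconvex)
```

With a threshold large enough to suppress the noise singular values, the convex estimator still subtracts the same amount from the large signal singular values, so the truncated (non-convex style) estimator typically achieves the smaller recovery error, which is the behavior the survey's experiments examine at scale.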
Similar resources
Low Rank Priors for Color Image Regularization
In this work we consider the regularization of vectorial data such as color images. Based on the observation that edge alignment across image channels is a desirable prior for multichannel image restoration, we propose a novel scheme of minimizing the rank of the image Jacobian and extend this idea to second derivatives in the framework of total generalized variation. We compare the proposed co...
Convolutional neural networks with low-rank regularization
Large CNNs have delivered impressive performance in various computer vision applications. But the storage and computation requirements make it problematic for deploying these models on mobile devices. Recently, tensor decompositions have been used for speeding up CNNs. In this paper, we further develop the tensor decomposition technique. We propose a new algorithm for computing the low-rank ten...
A Low-rank Tensor Regularization Strategy for Hyperspectral Unmixing
Tensor-based methods have recently emerged as a more natural and effective formulation to address many problems in hyperspectral imaging. In hyperspectral unmixing (HU), low-rank constraints on the abundance maps have been shown to act as a regularization which adequately accounts for the multidimensional structure of the underlying signal. However, imposing a strict low-rank constraint for the...
Kernel Locality Preserving Low-Rank Representation with Tikhonov Regularization
Classification based on Low-Rank Representation (LRR) has been a hot-topic in the field of pattern classification. However, LRR may not be able to fuse the local and global information of data completely and fail to represent nonlinear samples. In this paper, we propose a kernel locality preserving low-rank representation with Tikhonov regularization (KLP-LRR) for face recognition. KLP-LRR is a...
Bootstrap-Based Regularization for Low-Rank Matrix Estimation
We develop a flexible framework for low-rank matrix estimation that allows us to transform noise models into regularization schemes via a simple bootstrap algorithm. Effectively, our procedure seeks an autoencoding basis for the observed matrix that is stable with respect to the specified noise model; we call the resulting procedure a stable autoencoder. In the simplest case, with an isotropic ...
Journal
Journal title: Neural Networks
Year: 2021
ISSN: 1879-2782, 0893-6080
DOI: https://doi.org/10.1016/j.neunet.2020.09.021